AIbase

# Efficient text understanding

**Multi2convai Logistics Tr Bert** · inovex · MIT license
A BERT model optimized for Turkish, designed specifically for text classification tasks in the logistics domain.
Tags: Text Classification, Transformers, Other · Downloads: 119 · Likes: 0

**Distilbert Mlm 250k** · vocab-transformers
DistilBERT is a lightweight, distilled version of BERT that retains most of BERT's performance with fewer parameters and faster inference.
Tags: Large Language Model, Transformers · Downloads: 17 · Likes: 0

**Roberta Base 100M 1** · nyu-mll
A RoBERTa base model pretrained on 100M tokens (per the model name), reaching a validation perplexity of 3.93; suitable for English text-processing tasks.
Tags: Large Language Model · Downloads: 63 · Likes: 0

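The card above reports a validation perplexity of 3.93. Perplexity is simply the exponential of the mean cross-entropy loss (in nats), so the two are interchangeable; a minimal sketch of the conversion:

```python
import math

def perplexity(mean_ce_loss: float) -> float:
    """Perplexity is the exponential of the mean cross-entropy loss (in nats)."""
    return math.exp(mean_ce_loss)

def loss_from_perplexity(ppl: float) -> float:
    """Inverse mapping: recover the mean cross-entropy loss from a reported perplexity."""
    return math.log(ppl)

# The reported validation perplexity of 3.93 corresponds to a mean loss of ~1.37 nats.
print(round(loss_from_perplexity(3.93), 2))  # → 1.37
```

This is why lower perplexity means a better language model: it shrinks monotonically with the training objective itself.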
**Albert Small V2** · nreimers
ALBERT Small v2 is a 6-layer lightweight variant of ALBERT-base-v2, built on the Transformer architecture and suitable for natural language processing tasks.
Tags: Large Language Model, Transformers · Downloads: 62 · Likes: 0

**Roberta Med Small 1M 1** · nyu-mll
A RoBERTa model pretrained on a small 1M-token dataset using the MED-SMALL architecture, suitable for text-understanding tasks.
Tags: Large Language Model · Downloads: 23 · Likes: 1

© 2025 AIbase